25 research outputs found

    Civil Service Workforce Market Supply and the Effect on the Cost Estimating Relationships (CERs) that may affect the Productivity Factors for Future NASA Missions

    The upcoming retirement of the Baby Boomers will leave a performance gap between the younger generation (the future NASA decision makers) and the gray beards. This paper reflects on the average age of the workforce across NASA Centers, the aerospace industry, and other government agencies such as the DoD. It digs into productivity and realization factors and how they are applied to bimonthly payroll data for true full-time equivalent (FTE) calculations that could be used at each of the NASA Centers and in the other business systems now being implemented. The paper offers comparative cost solutions, from simple FTE cost estimating relationships (CERs) to complex CERs for monthly time-phasing of small research projects that start and finish within a government fiscal year. It presents the results of a parametric study investigating the cost-effectiveness of alternative performance-based CERs and how they feed into the Centers' forward pricing rate proposals (FPRPs). True CERs that reflect a younger workforce will affect the labor rates used in both commercial cost models and internal home-grown cost models, which may in turn impact the productivity factors for future NASA missions.
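
    To make the FTE arithmetic concrete, the sketch below converts one bimonthly payroll period's paid hours into productive full-time equivalents by applying a productivity (realization) factor, in the spirit of the calculations the paper describes. The 2,080-hour work year, the 24 payroll periods per year, and the 0.85 factor are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch only: converts one bimonthly payroll period's paid hours
# into productive FTEs. The 2,080-hour work year, 24 payroll periods per year,
# and the 0.85 productivity (realization) factor are assumed placeholder values.

HOURS_PER_WORK_YEAR = 2_080          # 52 weeks x 40 hours (assumption)
PAYROLL_PERIODS_PER_YEAR = 24        # bimonthly payroll cycle (assumption)

def productive_fte(paid_hours_in_period: float,
                   productivity_factor: float = 0.85) -> float:
    """Estimate the productive FTEs represented by one payroll period."""
    hours_per_period = HOURS_PER_WORK_YEAR / PAYROLL_PERIODS_PER_YEAR
    raw_fte = paid_hours_in_period / hours_per_period
    return raw_fte * productivity_factor

# Example: 8,600 paid hours in one period at an assumed 0.85 realization factor.
print(f"{productive_fte(8_600):.1f} productive FTEs")
```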

    Detection of Earth-impacting asteroids with the next generation all-sky surveys

    We have performed a simulation of the efficiency of a next-generation sky survey (Pan-STARRS 1) at detecting Earth-impacting asteroids. The steady-state sky-plane distribution of the impactors long before impact is concentrated towards small solar elongations (Chesley and Spahr, 2004), but we find that there is interesting and potentially exploitable behavior in the sky-plane distribution in the months leading up to impact. The next-generation surveys will find most of the dangerous impactors (>140 m diameter) during their decade-long survey missions, though there is the potential to miss difficult objects with long synodic periods appearing in the direction of the Sun, as well as objects with long orbital periods that spend much of their time far from the Sun and Earth. A space-based platform that can observe close to the Sun may be needed to identify many of the potential impactors that spend much of their time interior to the Earth's orbit. The next-generation surveys have a good chance of imaging a bolide like 2008 TC3 before it enters the atmosphere, but the difficulty will lie in obtaining enough images in advance of impact to allow an accurate pre-impact orbit to be computed. Comment: 47 pages, 16 figures, 2 tables

    Impact probability under aleatory and epistemic uncertainties

    We present an approach to estimate an upper bound for the impact probability of a potentially hazardous asteroid when part of the force model depends on unknown parameters whose statistical distribution needs to be assumed. As a case study, we consider Apophis' risk assessment for the 2036 and 2068 keyholes based on information available as of 2013. Within the framework of epistemic uncertainties, under the assumption of independence and non-correlation, we assign parametric families of distributions to the physical properties of Apophis that define the Yarkovsky perturbation and in turn the future orbital evolution of the asteroid. We find IP ≤ 5 × 10⁻⁵ for the 2036 keyhole and IP ≤ 1.6 × 10⁻⁵ for the 2068 keyhole. These upper bounds are largely conservative choices due to the rather wide range of statistical distributions that we explored.
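
    A minimal sketch of the general scheme, not the authors' actual pipeline: for each assumed (epistemic) family of parameter distributions, run a Monte Carlo over the aleatory samples, count the virtual asteroids that pass through the keyhole, and take the worst case across families as the upper bound. The keyhole test below is a hypothetical stand-in for a full orbit propagation.

```python
import random

def impact_probability_upper_bound(distribution_families, passes_through_keyhole,
                                   n_samples=100_000):
    """Upper-bound IP: worst Monte Carlo impact frequency over assumed families."""
    worst_ip = 0.0
    for sample_parameters in distribution_families:       # each family is a sampler
        hits = sum(
            passes_through_keyhole(sample_parameters())   # aleatory draw + "propagation"
            for _ in range(n_samples)
        )
        worst_ip = max(worst_ip, hits / n_samples)
    return worst_ip

# Toy example: two assumed families for a Yarkovsky-like parameter and a
# placeholder keyhole test; a real assessment would propagate each orbit.
families = [lambda: random.gauss(0.0, 1.0), lambda: random.uniform(-2.0, 2.0)]
print(impact_probability_upper_bound(families, lambda p: abs(p) > 3.5, 10_000))
```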

    NEOSurvey 1: Initial Results from the Warm Spitzer Exploration Science Survey of Near-Earth Object Properties

    Near-Earth Objects (NEOs) are small Solar System bodies whose orbits bring them close to the Earth's orbit. We are carrying out a Warm Spitzer Cycle 11 Exploration Science program entitled NEOSurvey --- a fast and efficient flux-limited survey of 597 known NEOs in which we derive a diameter and albedo for each target. The vast majority of our targets are too faint to be observed by NEOWISE, though a small sample has been or will be observed by both observatories, which allows for a cross-check of our mutual results. Our primary goal is to create a large and uniform catalog of NEO properties. We present here the first results from this new program: fluxes and derived diameters and albedos for 80 NEOs, together with a description of the overall program and approach, including several updates to our thermal model. The largest source of error in our diameter and albedo solutions, which derive from our single-band thermal emission measurements, is uncertainty in eta, the beaming parameter used in our thermal modeling; for albedos, improvements in Solar System absolute magnitudes would also help significantly. All data and derived diameters and albedos from this entire program are being posted on a publicly accessible webpage at nearearthobjects.nau.edu. Comment: AJ in press
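
    The coupling between diameter, geometric albedo, and absolute magnitude noted above follows the standard relation D(km) ≈ 1329 · 10^(−H/5) / √p_V. The sketch below uses assumed example values (not survey results) to show why improved absolute magnitudes tighten the albedo once the thermal diameter is fixed.

```python
import math

def diameter_km(h_mag: float, albedo: float) -> float:
    """Standard asteroid size relation: D = 1329 * 10**(-H/5) / sqrt(p_V)."""
    return 1329.0 * 10 ** (-h_mag / 5.0) / math.sqrt(albedo)

def albedo_from_diameter(h_mag: float, d_km: float) -> float:
    """Inverted relation: p_V = (1329 * 10**(-H/5) / D)**2."""
    return (1329.0 * 10 ** (-h_mag / 5.0) / d_km) ** 2

# Assumed example: a 0.5 km thermal diameter; a 0.3 mag shift in H changes the
# implied geometric albedo from roughly 0.18 to roughly 0.13.
for h in (19.0, 19.3):
    print(f"H = {h:.1f} -> p_V = {albedo_from_diameter(h, 0.5):.3f}")
```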

    The Pan-STARRS Moving Object Processing System

    We describe the Pan-STARRS Moving Object Processing System (MOPS), a modern software package that produces automatic asteroid discoveries and identifications from catalogs of transient detections from next-generation astronomical survey telescopes. MOPS achieves >99.5% efficiency in producing orbits from a synthetic but realistic population of asteroids whose measurements were simulated for a Pan-STARRS4-class telescope. Additionally, using a non-physical grid population, we demonstrate that MOPS can detect populations of currently unknown objects such as interstellar asteroids. MOPS has been adapted successfully to the prototype Pan-STARRS1 telescope despite differences in expected false detection rates, fill-factor loss, and relatively sparse observing cadence compared to a hypothetical Pan-STARRS4 telescope and survey. MOPS remains >99.5% efficient at detecting objects on a single night but drops to 80% efficiency at producing orbits for objects detected on multiple nights. This loss is primarily due to configurable MOPS processing limits that are not yet tuned for the Pan-STARRS1 mission. The core MOPS software package is the product of more than 15 person-years of software development and incorporates countless additional years of effort in third-party software to perform lower-level functions such as spatial searching or orbit determination. We describe the high-level design of MOPS and its essential subcomponents, discuss the suitability of MOPS for other survey programs, and suggest a road map for future MOPS development. Comment: 57 pages, 26 figures, 13 tables
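
    For context on the efficiency figures quoted above, survey simulations typically define efficiency as the fraction of findable synthetic objects that the pipeline actually recovers. The toy bookkeeping below (hypothetical object IDs, not MOPS code) illustrates the metric.

```python
def linking_efficiency(findable_ids: set, found_ids: set) -> float:
    """Fraction of findable synthetic objects that were actually recovered."""
    if not findable_ids:
        return 0.0
    return len(findable_ids & found_ids) / len(findable_ids)

# Toy example: 1,000 findable synthetic asteroids, 996 recovered -> 99.6%.
findable = {f"S{i:04d}" for i in range(1_000)}
found = {f"S{i:04d}" for i in range(996)}
print(f"{linking_efficiency(findable, found):.1%}")
```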

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ∼24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320–1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ∼ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world. Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
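
    As a rough plausibility check on the quoted survey rate, the sketch below combines the 9.6 deg² field of view, paired 15-second exposures, and two visits per pointing per night from the abstract with an assumed per-exposure overhead; the overhead value is an assumption for illustration, not an LSST specification.

```python
# Back-of-envelope check of "10,000 square degrees in a single filter in
# three nights". The 5-second readout/slew overhead per exposure is assumed.

FIELD_OF_VIEW_DEG2 = 9.6
EXPOSURE_S = 15.0
EXPOSURES_PER_VISIT = 2
VISITS_PER_POINTING_PER_NIGHT = 2
OVERHEAD_PER_EXPOSURE_S = 5.0        # assumed readout + slew share

sky_deg2 = 10_000.0
nights = 3

pointings = sky_deg2 / FIELD_OF_VIEW_DEG2
seconds_per_visit = EXPOSURES_PER_VISIT * (EXPOSURE_S + OVERHEAD_PER_EXPOSURE_S)
hours_per_night = (pointings * VISITS_PER_POINTING_PER_NIGHT
                   * seconds_per_visit) / 3600.0 / nights

# Roughly 1,042 pointings and about 7.7 hours per night, i.e. a full night.
print(f"{pointings:.0f} pointings, about {hours_per_night:.1f} hours per night")
```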

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with a field of view of 9.6 deg², and will be devoted to a ten-year imaging survey over 20,000 deg² south of +15 deg. Each pointing will be imaged 2000 times with fifteen-second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects both at low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy. Comment: 596 pages. Also available at full resolution at http://www.lsst.org/lsst/sciboo
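
    The quoted coadded depth can be sanity-checked with the usual background-limited scaling, in which stacking N visits deepens the limit by 1.25·log₁₀(N) magnitudes. The single-visit depth of r ≈ 24.5 is taken from the LSST overview abstract above; treating roughly 180 of the ~2000 exposures as r-band visits is an assumption for illustration.

```python
import math

def coadded_depth(single_visit_depth: float, n_visits: int) -> float:
    """Background-limited coadd: depth gain = 1.25 * log10(N) magnitudes."""
    return single_visit_depth + 1.25 * math.log10(n_visits)

# r ~ 24.5 per visit (from the overview abstract); ~180 r-band visits assumed.
# The result lands near the quoted coadded depth of r ~ 27.5.
print(f"r_coadd ~ {coadded_depth(24.5, 180):.1f}")
```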

    Flatland: Rapid Prototyping of Distributed Internet Applications

    Computer intra- and internets are widely used for client-server applications such as web browsers. With the exception of e-mail, however, the same networks are seldom used for distributed, client-client or client-server-client applications. Such applications are difficult to develop and debug, and require a supporting infrastructure that is not readily available from existing systems. Flatland is a rapid prototyping environment that provides the underlying infrastructure and makes it easy to create and debug distributed internet application prototypes. In addition to the infrastructure needed for a distributed application, Flatland includes safe implementations of the most common sources of distributed application bugs – asynchronous operation and updating. Flatland also supports streaming audio-video and down-level clients.
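
    As a generic illustration of the client-server-client pattern the abstract refers to (this is not Flatland's API, which the abstract does not describe), the sketch below relays each client's updates to every other connected client through a single asynchronous server.

```python
import asyncio

# Minimal client-server-client relay sketch (illustrative only, not Flatland).
# The server forwards each line it receives to every other connected client,
# which is the basic pattern behind client-client applications.

clients: set = set()

async def handle_client(reader: asyncio.StreamReader,
                        writer: asyncio.StreamWriter) -> None:
    clients.add(writer)
    try:
        while line := await reader.readline():   # empty bytes at EOF ends the loop
            for other in list(clients):
                if other is not writer:
                    other.write(line)             # relay the update
                    await other.drain()
    finally:
        clients.discard(writer)
        writer.close()

async def main() -> None:
    server = await asyncio.start_server(handle_client, "127.0.0.1", 8888)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```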